Cyberstalking and Online Platforms’ Due Diligence in the EU Digital Services Act

By Irene Kamara

Cyberstalking: a pattern of abusive online behaviours

Cyberstalking, the act of using electronic communication devices to create a criminal level of intimidation, harassment, and fear in one or more victims,[1] is a form of cyberviolence, usually gender-based, with immense impacts on the physical and mental well-being of the victim. The Council of Europe Istanbul Convention on preventing and combating violence against women and domestic violence defines stalking as “the intentional conduct of repeatedly engaging in threatening conduct directed at another person, causing her or him to fear for her or his safety.”[2] The defining characteristic of cyberstalking is the repeated nature of the online harassment: it constitutes a pattern of behaviour rather than one isolated incident.[3] Because of this, while the victim may experience a continuous threat, classifying different events by a single offender or multiple offenders as one cyberstalking offence and prosecuting it runs into several evidentiary obstacles. One such obstacle is that the victim needs to maintain records of the separate events over the extended period that, taken together, amounts to the cyberstalking offence. Where punishable, cyberstalking usually falls under criminal law provisions on harassment, especially in jurisdictions that have signed and ratified the Istanbul Convention of the Council of Europe. However, regulatory approaches targeting the offender are not the only strategy to mitigate cyberstalking as a phenomenon. Online platforms such as social media platforms de facto offer a means that facilitates cyberstalking, since offenders use them to engage in unwanted communication, such as threats against one or more victims, or to publicise defamatory or image-based abusive material. Several of the most popular platforms have adopted their own community standards on accepted behaviour. For example, Meta has a policy in place on bullying and harassment,[4] in which inter alia the platform commits to “remove content that’s meant to degrade or shame, including, for example, claims about someone’s sexual activity.” Those policies, however, are largely voluntary measures, and their appropriateness is often not reviewed by external state actors, such as an independent supervisory authority.

Cyberstalking and the EU Digital Services Act

Since 2022, the EU has had a new Regulation in place assigning a range of responsibilities to online platforms, such as Meta, to identify and take down illegal content, including cyberstalking. The Digital Services Act (‘DSA’)[5] aims at providing harmonised EU rules for a “safe, predictable and trusted online environment”,[6] by inter alia establishing rules on due diligence obligations for providers of intermediary services. The DSA modernised some of the provisions of the 2000 e-Commerce Directive[7] and restated others, such as the provision clarifying that providers of intermediary services are under no general obligation to monitor the information in their services, nor to engage in active fact-finding to establish whether illegal activity is taking place through abuse of their services.[8]

Despite the absence of a general monitoring obligation, providers of intermediary services are subject to several obligations in order to ensure the online safety and trust of the users of their services.[9] Those due diligence obligations, explained in the next section, are centered around the concept of illegal content. The DSA defines illegal content in Article 3(h) as “any information that, in itself or in relation to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of any Member State which is in compliance with Union law, irrespective of the precise subject matter or nature of that law.” The concept of content is thus very broad, meaning any information, ‘products, services and activities’,[10] and whether this content is illegal is determined by reference to other EU or Member State law. Once information that infringes EU law or national Member State law is shared, publicised, transmitted, or stored, the due diligence framework established in the DSA applies to the provider of intermediary services. Recital 12 DSA provides additional interpretive clarity on the parameters and examples of illegal content: applicable rules might render the content itself illegal, or the content might be rendered illegal because it relates to illegal activities. Examples include the sharing of child sexual abuse material (CSAM), hate speech or terrorist content, and online stalking (cyberstalking). As a result of this broad definition, even acts or information that are not as such illegal, but relate to the illegal activity of cyberstalking, also qualify as illegal content and are subject to the DSA. This is an important step towards regulating cyberstalking, and essentially towards limiting the single acts of the cyberstalker that cause nuisance or harassment to the victim(s) and to other related targets of the offence, such as the friends, family, or work environment of the victim(s).

The DSA due diligence framework: placing the responsibility on online platforms?

The e-Commerce Directive already provided an obligation for information society service providers to remove or disable access to information upon obtaining knowledge of an illegal activity.[11] The DSA develops a due diligence framework that involves service providers taking action in a reactive manner (e.g. once a report about an abusive image is filed with an online platform), but also in a proactive manner. The due diligence framework ensures that service providers, and especially large online platforms, have assessed the systemic risks stemming from the design and the functioning of their services.[12] The due diligence framework comprises rules relating to transparency, cooperation with law enforcement and judicial authorities, and proactive measures against misuse of the offered services. In terms of proactive measures, very large online platforms must put in place mitigation measures tailored to systemic risks and adapt their moderation processes, in particular in cases of cyberviolence, which includes cyberstalking. The risk of dissemination of CSAM is, according to Recital 80 DSA, one of the categories of such systemic risks. The mitigation measures include the expeditious removal of, or disabling of access to, the illegal content, and adapting the speed and quality of processing notices (Art. 35(1)(c) DSA). In terms of transparency, specifically for online platforms, the DSA imposes strict reporting rules as regards the use of automated moderation tools, including specification of error rates and applied safeguards,[13] as well as detailed reporting of the number of suspensions of the provision of services due to misuse.[14] As regards cooperation with law enforcement and judicial authorities, all hosting providers must notify the competent authorities of a suspicion that a criminal offence threatening an individual’s safety or life is taking place. The notification threshold is quite low, since Art. 18(1) DSA requires not proven illegal behaviour, but a suspicion that such behaviour is taking place. This means that, in cases of cyberstalking, any act pointing the service provider towards potentially repeated threats against an individual, whether made directly or indirectly via friends, family, or colleagues, would require a report to the law enforcement authority.
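
To make the low notification threshold concrete, the following minimal sketch (in Python; the category names, data model, and confidence score are hypothetical assumptions of this illustration, not taken from the DSA or any real moderation system) shows how a hosting provider's internal tooling might decide that a signal about suspected cyberstalking already warrants a report under Art. 18(1) DSA, even without any finding of illegality:

    # Illustrative sketch only: categories, fields, and the threshold are
    # assumptions, not requirements taken from the DSA text.
    from dataclasses import dataclass

    # Offence categories the provider treats as "threatening the life or
    # safety of a person" within the meaning of Art. 18(1) DSA (assumed mapping).
    LIFE_OR_SAFETY_OFFENCES = {"threat_of_violence", "stalking", "incitement_to_harm"}

    @dataclass
    class ModerationSignal:
        content_id: str
        suspected_offence: str   # provider's internal classification
        confidence: float        # internal suspicion score between 0.0 and 1.0

    def must_notify_authorities(signal: ModerationSignal, suspicion_floor: float = 0.2) -> bool:
        """Return True when the provider should notify law enforcement.
        Art. 18(1) DSA is triggered by suspicion, not proof, so the floor
        is deliberately low (an assumption of this sketch)."""
        return (signal.suspected_offence in LIFE_OR_SAFETY_OFFENCES
                and signal.confidence >= suspicion_floor)

    # Even a single, low-confidence signal of stalking would require a report.
    print(must_notify_authorities(ModerationSignal("post-123", "stalking", 0.3)))  # True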

Next steps

The DSA entered into force in 2022 but starts applying in early 2024, since the EU legislator provided a grace period for service providers within the scope of the DSA to adapt to the new set of obligations. While hate speech, CSAM, and copyright-infringing material can be expected to monopolise the focus of platforms, and the related complaints and reports, in the first period of the DSA’s application, the DSA will also be tested as a regulatory instrument against cyberstalking and as regards the role of intermediaries, in this case online platforms, in combatting such abusive online behaviour.


[1] Pittaro, M. L. (2007). Cyber stalking: An Analysis of Online Harassment and Intimidation. International Journal of Cyber Criminology, 1(2), 180–197. https://doi.org/10.5281/zenodo.18794

[2] Article 34 Council of Europe Convention on preventing and combating violence against women and domestic violence (‘Istanbul Convention’), Council of Europe Treaty Series No. 210.

[3] Vidal Verástegui, J., Romanosky, S., Blumenthal, M. S., Brothers, A., Adamson, D. M., Ligor, D. C., … & Schirmer, P. (2023). Cyberstalking: A Growing Challenge for the US Legal System.

[4] https://transparency.fb.com/policies/community-standards/bullying-harassment/

[5] Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act) OJ L 277, 27.10.2022, p. 1–102.

[6] Article 1(1) DSA.

[7] Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) OJ L 178, 17.7.2000, p. 1–16.

[8] Read further on the prohibition of general monitoring obligations: Senftleben, Martin and Angelopoulos, Christina, The Odyssey of the Prohibition on General Monitoring Obligations on the Way to the Digital Services Act: Between Article 15 of the E-Commerce Directive and Article 17 of the Directive on Copyright in the Digital Single Market, Amsterdam/Cambridge, October 2020, https://ssrn.com/abstract=3717022

[9] Recital 41 DSA.

[10] Recital 12 DSA.

[11] Recital 46 e-commerce Directive.

[12] Article 34 DSA.

[13] Art. 15 DSA.

[14] Art. 24 DSA.

EU Adoption of DAC 8 – Mandatory Exchange of Information between Tax Authorities on Crypto Assets

By Amedeo Rizzo

On the 17th of October 2023, the Council of the European Union approved Directive DAC 8 on administrative cooperation (Press Release), introducing significant modifications related to the communication and automatic exchange of information regarding proceeds from operations in crypto-assets and information on advance tax rulings for high-net-worth individuals. With this directive, the EU, considering the new opportunities brought about by digitalization, aims to expand the scope of the obligation for automatic exchange of information, fostering a higher degree of administrative cooperation among tax administrations.

Crypto assets definition and tax problems

The term crypto asset refers to a digital representation of value that relies on a cryptographically secured distributed ledger to validate and secure transactions[1]. This mechanism establishes a tamper-resistant record of transactions within the asset without the need for a central authority. The challenge in categorizing assets within this broad class arises from ongoing innovation and the diverse range of services that specific assets can offer. Distinguishing these assets for tax purposes is complex due to these factors.

However, a fundamental tax-relevant dimension that aids in their characterization is the distinction between their use for investment purposes and as a means of payment. At one end of the spectrum are “security tokens,” which essentially serve as digital representations of traditional financial or other assets. An example is “non-fungible tokens” (NFTs), which are cryptographically protected representations of unique assets, such as works of art. Conversely, central bank digital currencies (CBDCs) might be considered more similar to fiat currency in digital form. While some national governments remain cautious about their adoption, the prevailing expectation is that the issuance of CBDCs will become widespread over time[2].

The primary impediment in the taxation of crypto assets stems from their inherent “anonymous” nature, wherein transactions employ public addresses that prove exceptionally challenging to associate with individuals or entities. This characteristic introduces a heightened susceptibility to tax evasion, placing the onus on tax authorities to address implementation challenges effectively.

When transactions occur through centralized exchanges, the challenge becomes more manageable as these exchanges can be subjected to standard know your customer (KYC) tracking rules and potential withholding taxes.

Background and content

On December 7, 2021, the Council, in its report to the European Council regarding tax matters, communicated its anticipation that the European Commission would present a legislative proposal in 2022 for the additional amendment of Directive 2011/16/EU on administrative cooperation in taxation (DAC).

This proposed amendment specifically pertained to the exchange of information regarding crypto-assets and tax rulings applicable to individuals with substantial wealth. According to the Council, it was imperative to fortify the stipulations of Directive 2011/16/EU pertaining to the information to be reported or exchanged to accommodate the evolving landscape of diverse markets and, consequently, to effectively address identified instances of tax fraud, tax evasion, and tax avoidance, by facilitating effective reporting and exchange of information.

In light of this objective, the Directive encompasses, among other aspects, the most recent revisions to the Common Reporting Standard (CRS) of the OECD. Notably, this includes the incorporation of provisions pertaining to electronic money and central bank digital currencies (CBDCs) delineated in Part II of the Crypto-Asset Reporting Framework and Amendments to the Common Reporting Standard, endorsed by the OECD on August 26, 2022.

Moreover, the Directive extends the purview of the automatic exchange of information concerning advance cross-border rulings to encompass specific rulings concerning individuals. In particular, it includes in the scope of the current regulation the rulings involving high-net-worth individuals, as well as provisions on automatic exchange of information on non-custodial dividends and similar revenues.

Additionally, the Directive enhances the regulations governing the reporting and communication of Tax Identification Numbers (TIN). The objective is to streamline the identification process for tax authorities, enabling them to accurately identify pertinent taxpayers and assess associated taxes. Additionally, the Directive seeks to modify provisions within the DAC concerning penalties imposed by Member States on individuals who fail to comply with national legislation related to reporting requirements established in accordance with the DAC.

This approach is adopted to ensure uniformity and coherence in the application of these provisions across Member States.

Problems addressed by the Directive

The bottom line of DAC 8 revolves around the imperative of instituting mandatory reporting for crypto-asset service providers falling within the ambit of the Markets in Crypto-Assets (MiCA) Regulation. Additionally, all other crypto-asset operators offering services to residents of the EU are required to comply. Non-EU operators must register in a Member State to adhere to DAC 8 regulations, ensuring the reporting of pertinent information. This strategic approach equips the tax authorities of Member States with the requisite tools to monitor income generated from crypto assets by EU users and to implement the necessary measures to ensure tax compliance.

The reporting mechanism entails three sequential steps. Initially, crypto-asset service providers collect information on the reportable transactions carried out by their users. Subsequently, the providers submit the compiled information to the competent tax authority of their Member State (for EU providers) or to the competent authority of the Member State of registration (for non-EU providers). Lastly, the competent tax authority transmits the reported information, inclusive of the TIN of the reported users, to the competent authorities of the users’ respective Member States of residence.

The Directive also emphasizes reporting requirements concerning reportable users and crypto assets. Reportable users are mandated to furnish their:

  • complete name;
  • address;
  • Member State of residence;
  • date and place of birth;
  • TIN.

Reportable crypto assets are to be identified by their complete name and the aggregate gross amount paid or the aggregate fair market value.

Reporting crypto-asset service providers are obligated to obtain a self-certification from users, encompassing information crucial for determining the user’s tax residence, such as full name, date of birth, residence address, and TIN. The Directive allows a substantial degree of discretion in evaluating the reliability of this self-certification, permitting providers to verify information using alternative sources, including their own customer due diligence procedures, in case of doubt. If a user accesses the platform through a Member State’s digital identity system, the provider is exempt from collecting certain information but is still required to obtain the user’s full name, the identification service used, and the Member State of issuance.
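
By way of a purely illustrative sketch (the class and field names below are hypothetical and do not reproduce the Directive's reporting schema or any official format), the data points to be collected in the first step of the reporting flow can be pictured as simple records that a provider assembles before transmission to its competent tax authority:

    # Illustrative only: classes, fields, and the completeness check are
    # assumptions of this sketch, not the Directive's reporting schema.
    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class ReportableUser:
        full_name: str
        address: str
        member_state_of_residence: str     # e.g. "IT"
        date_of_birth: date
        place_of_birth: str
        tin: Optional[str]                 # Tax Identification Number

    @dataclass
    class ReportableCryptoAsset:
        full_name: str                     # complete name of the crypto asset
        aggregate_gross_amount_eur: float  # or aggregate fair market value

    def self_certification_complete(user: ReportableUser) -> bool:
        """Rough check that the self-certification data needed to determine the
        user's tax residence is present; in case of doubt the provider may
        verify entries against its own customer due diligence records."""
        return all([user.full_name, user.address, user.member_state_of_residence,
                    user.date_of_birth, user.tin])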

The Directive incorporates provisions facilitating the effective implementation of the proposed measures, including mechanisms for enforcing compliance by non-EU crypto-asset operators with EU resident users. In instances where non-EU operators fail to comply with reporting obligations due to a lack of registration in a Member State, the DAC 8 grants Member States the authority to employ effective, proportionate, and dissuasive measures to ensure compliance, potentially encompassing measures that may prohibit the operator from operating within the EU as a last resort (Article 8ad).

Conclusion

In summary, the recently approved DAC8 emerges as one of the needed responses to the evolving landscape of crypto assets, acknowledging some of the inherent challenges in taxation posed by their anonymous nature and the dynamic innovation within this domain.

By bridging the information gap and enhancing reporting mechanisms, DAC 8 empowers tax administrations to monitor and enforce compliance, thus mitigating some of the potential tax risks associated with crypto assets and tax rulings. The Directive, with its comprehensive approach and emphasis on international cooperation, is a critical step towards achieving transparency in the taxation of these emerging financial instruments.


[1] K. Baer, R. de Mooij, S. Hebous, M. Keen (2023). Taxing Cryptocurrencies, IMF WP/23/144.

[2] Ibid.

Large Language Models and the EU AI Act: the Risks from Stochastic Parrots and Hallucination

By Zihao Li[1]

With the launch of ChatGPT, Large Language Models (LLMs) are shaking up our whole society, rapidly altering the way we think, create and live. For instance, the GPT integration in Bing has altered our approach to online searching. While nascent LLMs have many advantages, new legal and ethical risks[2] are also emerging, stemming in particular from stochastic parrots and hallucination. The EU is the first and foremost jurisdiction that has focused on the regulation of AI models.[3] However, the risks posed by the new LLMs are likely to be underestimated by the emerging EU regulatory paradigm. Therefore, this correspondence warns that the European AI regulatory paradigm must evolve further to mitigate such risks.

Stochastic parrots and hallucination: unverified information generation

One potentially fatal flaw of LLMs, exemplified by ChatGPT, is that the generation of information is unverified. For example, ChatGPT often generates pertinent but non-existent academic reading lists. Data scientists attribute this problem to “hallucination”[4] and “stochastic parrots”.[5] Hallucination occurs when LLMs generate text based on their internal logic or patterns, rather than the true context, leading to confident but unjustified, unverified, and deceptive responses. The stochastic parrot problem is the repetition of training data or its patterns, rather than actual understanding or reasoning.

The text production method of LLMs is to reuse, reshape, and recombine the training data in new ways to answer new questions, while ignoring the problem of the authenticity and trustworthiness of the answers. In short, LLMs only predict the probability of a particular word coming next in a sequence, rather than actually comprehending its meaning. Although the majority of answers are of high quality and true, the content of the answers is in effect made up. Even though most training data is reliable and trustworthy, the essential issue is that the recombination of trustworthy data into new answers in a new context may lead to untrustworthiness, as the trustworthiness of information is conditional and often context-bound. If this precondition of trustworthy data disappears, trust in the answers will be misplaced. Therefore, while the LLMs’ answers may seem highly relevant to the prompts, they are made up.
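
A toy example may make this mechanism concrete. The sketch below is a deliberately naive bigram model, not how production LLMs are built, but it illustrates the same underlying principle: each next word is sampled purely from the probability of word sequences seen in the training data, so the output can be fluent and plausible while carrying no guarantee of truth.

    # Toy "stochastic parrot": next-word prediction from bigram frequencies.
    # A deliberately minimal illustration, not an actual LLM.
    import random
    from collections import defaultdict

    training_text = ("the cited paper proves the theorem "
                     "the cited paper appears in the journal "
                     "the journal rejected the paper")

    # Count how often each word follows another in the training data.
    bigrams: dict[str, list[str]] = defaultdict(list)
    words = training_text.split()
    for current, nxt in zip(words, words[1:]):
        bigrams[current].append(nxt)

    def generate(start: str, length: int = 8) -> str:
        """Sample each next word in proportion to how often it followed the
        previous one in training; no step checks whether the claim is true."""
        output = [start]
        for _ in range(length):
            candidates = bigrams.get(output[-1])
            if not candidates:
                break
            output.append(random.choice(candidates))
        return " ".join(output)

    print(generate("the"))
    # e.g. "the cited paper proves the journal rejected the paper"
    # -> grammatical and plausible-sounding, yet entirely unverified.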

However, merely improving the accuracy of the models through new data and algorithms is insufficient, because the more accurate the model is, the more users will rely on it, and thus be tempted not to verify the answers, leading to greater risk when stochastic parrots and hallucinations appear. This situation, where an increase in accuracy leads to higher reliance and potential risks, can be described as the ‘accuracy paradox’. The risk is beyond measure if users encounter these problems in especially sensitive areas such as healthcare or the legal field. Even if utilizing real-time internet sources, the trustworthiness of LLMs may remain compromised, as exemplified by factual errors in new Bing’s launch demo.

These risks can lead to ethical concerns, including misinformation and disinformation, which may adversely affect individuals through misunderstandings, erroneous decisions, loss of trust, and even physical harm (e.g., in healthcare). Misinformation and disinformation can reinforce bias,[6] as LLMs may perpetuate stereotypes present in their training data.[7]

The EU AI regulatory paradigm: Advanced Legal intervention required

The EU has already commenced putting effort into AI governance. The AI Act (AIA) is the first and globally most ambitious attempt to regulate AI. However, the proposed AIA, employing a risk-based taxonomy for AI regulation, encounters difficulties when applied to general-purpose LLMs. On the one hand, categorizing LLMs as high-risk AI due to their generality may impede EU AI development. On the other hand, if general-purpose LLMs are regarded as chatbots, falling within the limited-risk group, merely imposing transparency obligations (i.e., providers need to disclose that the answer is generated by AI) would be insufficient,[8] because the danger of parroting and hallucination risks relates not only to whether users are clearly informed that they are interacting with AI, but also to the reliability and trustworthiness of LLMs’ answers, i.e., how users can distinguish between true and made-up answers. When a superficially eloquent and knowledgeable chatbot generates unverified content with apparent confidence, users may trust the fictitious content without undertaking verification. Therefore, the AIA’s transparency obligation is not sufficient.

Additionally, the AIA does not fully address the role, rights, or responsibilities of end-users. As a result, users have no avenue to contest or complain about LLMs, especially when stochastic parrots and hallucination occur and affect their rights. Moreover, the AIA does not impose any obligations on users. However, as noted above, the occurrence of disinformation is largely due to deliberate misuse by users. Without imposing responsibilities on the user side, it is difficult to regulate the harmful use of AI by users. Meanwhile, it is argued that the logic of the AIA is to work backward from certain harms to measures that mitigate the risk that these harms materialize.[9] The primary focus ought to shift towards the liability associated with the quality of input data, rather than imposing unattainable obligations on data quality.

Apart from the AIA, the Digital Services Act (DSA) aims to govern disinformation. However, the DSA’s legislators focus only on the responsibilities of the intermediary, overlooking the source of the disinformation. Imposing obligations only on intermediaries when LLMs are embedded in services is insufficient, as such regulation cannot reach the underlying developers of LLMs. Similarly, the Digital Markets Act (DMA) focuses on the regulation of gatekeepers, aiming to establish a fair and competitive market. Although scholars have recently claimed that the DMA has significant implications for AI regulation,[10] the DMA primarily targets the effects of AI on market structure; it can provide only limited help on LLMs. The problem that the DSA and DMA will face is that both govern only the platform, not the usage, performance, and output of AI per se. This regulatory approach is a consequence of the current platform-as-a-service (PaaS) business model. However, once the business model shifts to AI model-as-a-service (MaaS),[11] this regulatory framework is likely to become nugatory, as the platform does not fully control the processing logic and output of the algorithmic model.

Therefore, it is necessary to urgently reconsider the regulation of general-purpose LLMs.[12] The parroting and hallucination issues show that minimal transparency obligations are insufficient, since LLMs often lull users into misplaced trust. When using LLMs, users should be acutely aware that the answers are made-up, may be unreliable, and require verification. LLMs should be obliged to remind and guide users on content verification. Particularly when prompted with sensitive topics, such as medical or legal inquiries, LLMs should refuse to answer, instead directing users to authoritative sources with traceable context. The suitable scope for such filter and notice obligations warrants further discussion from legal, ethical and technical standpoints.
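
As a purely illustrative sketch of what such a filter and notice mechanism might look like at the application layer (the topic keywords, messages, and function names are invented, not drawn from the AIA or any deployed system):

    # Illustrative only: a toy filter-and-notice wrapper around an LLM answer.
    # Topic keywords, messages, and routing are assumptions of this sketch.
    SENSITIVE_TOPICS = {
        "medical": "Please consult a qualified medical professional or official health service.",
        "legal": "Please consult a qualified lawyer or an official legal information source.",
    }

    VERIFICATION_NOTICE = ("This answer was generated by an AI model; it may be "
                           "inaccurate or made up and should be independently verified.")

    def filter_and_notice(prompt: str, model_answer: str) -> str:
        """Refuse sensitive prompts and point to authoritative sources; otherwise
        return the model's answer together with a verification reminder."""
        lowered = prompt.lower()
        for topic, redirect in SENSITIVE_TOPICS.items():
            if topic in lowered:   # naive keyword matching, for illustration only
                return f"I cannot answer {topic} questions. {redirect}"
        return f"{model_answer}\n\n{VERIFICATION_NOTICE}"

    print(filter_and_notice("Is this contract legal advice I can rely on?", "..."))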

Furthermore, legislators should reassess the risk-based AI taxonomy in the AIA. The above discussion suggests that the effective regulation of LLMs needs to ensure their trustworthiness, taking into account the reliability, explainability and traceability of generated information, rather than solely focusing on transparency. Meanwhile, end-users, developers and deployers’ roles should all be considered in AI regulations, while shifting focus from PaaS to AI MaaS.


[1] The work is adapted and developed from the preprint version of a paper published in Nature Machine Intelligence, “Zihao Li, ‘Why the European AI Act Transparency Obligation Is Insufficient’ [2023] Nature Machine Intelligence. https://doi.org/10.1038/s42256-023-00672-y”

[2] ‘Much to Discuss in AI Ethics’ (2022) 4 Nature Machine Intelligence 1055.

[3] Zihao Li, ‘Why the European AI Act Transparency Obligation Is Insufficient’ [2023] Nature Machine Intelligence.

[4] Ziwei Ji and others, ‘Survey of Hallucination in Natural Language Generation’ [2022] ACM Computing Surveys 3571730.

[5] Emily M Bender and others, ‘On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?’, Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency (ACM 2021) <https://dl.acm.org/doi/10.1145/3442188.3445922> accessed 14 January 2023.

[6] Marvin van Bekkum and Frederik Zuiderveen Borgesius, ‘Using Sensitive Data to Prevent Discrimination by Artificial Intelligence: Does the GDPR Need a New Exception?’ (2023) 48 Computer Law & Security Review 105770.

[7] Zihao Li, ‘Affinity-Based Algorithmic Pricing: A Dilemma for EU Data Protection Law’ (2022) 46 Computer Law & Security Review 1.

[8] Lilian Edwards, ‘The EU AI Act: A Summary of Its Significance and Scope’ (Ada Lovelace Institute 2022) <https://www.adalovelaceinstitute.org/wp-content/uploads/2022/04/Expert-explainer-The-EU-AI-Act-11-April-2022.pdf> accessed 17 January 2023.

[9] Martin Kretschmer and others, ‘The Risks of Risk-Based AI Regulation: Taking Liability Seriously’.

[10] Philipp Hacker, Johann Cordes and Janina Rochon, ‘Regulating Gatekeeper AI and Data: Transparency, Access, and Fairness under the DMA, the GDPR, and Beyond’ [2022] SSRN Electronic Journal <https://www.ssrn.com/abstract=4316944> accessed 8 January 2023.

[11] Tianxiang Sun and others, ‘Black-Box Tuning for Language-Model-as-a-Service’, Proceedings of the 39th International Conference on Machine Learning (PMLR 2022) <https://proceedings.mlr.press/v162/sun22e.html> accessed 10 February 2023.

[12] Philipp Hacker, Andreas Engel and Theresa List, ‘Understanding and Regulating ChatGPT, and Other Large Generative AI Models: With input from ChatGPT’ [2023] Verfassungsblog <https://verfassungsblog.de/chatgpt/> accessed 20 May 2023.

The EU Foreign Subsidies Regulation: a Structural Change to the Internal Market

By Amedeo Rizzo

The EU Foreign Subsidies Regulation (“FSR”) was published on 14 December 2022 and entered into force on 12 January 2023. The Regulation creates a new regime with the objective of protecting the internal market of the European Union from distortions created by foreign subsidies. In doing so, the FSR imposes an approval procedure for foreign subsidies to companies engaging in commercial activities in the EU, as well as notification obligations for M&A activities involving significant EU businesses and for large EU public contracts.

The objective of the Regulation is to close an existing loophole in internal market supervision, which was very restrictive as regards state aid granted by EU Member States but did not take into account possible distortions coming from non-EU countries. This is supposed to create a level playing field for all companies that operate in the EU, supervised by the European Commission through ex officio investigatory powers and the power to impose measures to ensure compliance.

Foreign Subsidies covered by the Regulation

The FSR covers any form of contribution, direct or indirect, provided by non-EU governments or by any public or private entity whose actions are attributable to the government of a non-EU country. Contributions could be distortive where they confer benefits that would not normally be available on the market to an EU company, and which are selective in the way they advantage one or more companies or industries, as opposed to all companies or all companies active in a particular industry.

The notion of financial contributions under the FSR is quite a broad concept, encompassing many forms of advantage. As provided in the Regulation, financial contributions include, but are not limited to:

  • the transfer of funds/liabilities, such as capital injections, grants, loans, guarantees, tax incentives, the setting off of operating losses, compensation for financial burdens imposed by public authorities, debt forgiveness, debt to equity swaps or rescheduling;
  • the foregoing of revenue that is otherwise due, such as tax exemptions or the granting of special or exclusive rights without adequate remuneration; or
  • the provision of goods or services or the purchase of goods or services.

These kinds of benefits include zero- or low-interest loans, tax exemptions and reductions, state-funded R&D and other forms of intellectual property subsidization, government contracts and grants of exclusive rights without adequate remuneration.

The entities whose contributions to companies operating in the EU internal market are covered by the Regulation are all the entities related to the non-EU country and therefore include:

  • the central government and public authorities at all other levels;
  • any foreign public entity whose actions can be attributed to the third country, taking into account elements such as the characteristics of the entity and the legal and economic environment prevailing in the State in which the entity operates, including the government’s role in the economy; or
  • any private entity whose actions can be attributed to the third country, taking into account all relevant circumstances.

Distortion of competition in the EU

One of the fundamental factors to trigger the FSR is that the foreign subsidy needs to potentially distort competition in the EU, meaning that it negatively affects it.

Distortions in the internal market are determined on the basis of indicators, which can include:

  • the amount of the foreign subsidy;
  • the nature of the foreign subsidy;
  • the situation of the undertaking, including its size and the markets or sectors concerned;
  • the level and evolution of the economic activity of the undertaking on the internal market;
  • the purpose and conditions attached to the foreign subsidy as well as its use on the internal market.

In general, the Commission seems to have quite extensive discretionary power in the decision-making process of recognizing the negative effects. However, it will also have to take into account the positive effects on the market, which will burden the Commission with a balancing test.

The Regulation provides some value-based thresholds indicating which financial contributions are likely to distort competition (a brief illustrative sketch follows the list):

  • A subsidy that does not exceed the de minimis aid measures, contained in Regulation (EU) No 1407/2013 (EUR 200,000 per third country over any consecutive period of three years) shall not be considered distortive.
  • A subsidy that does not exceed EUR 4 million per undertaking over any consecutive period of three years is unlikely to cause distortions.
  • A subsidy that exceeds EUR 4 million is likely to cause distortions if it negatively affects competition in the EU.
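
As a minimal sketch (the function name and the textual labels are invented; the EUR 200,000 and EUR 4 million figures are those summarised in the list above), these thresholds can be read as a simple three-band sorting of the total foreign financial contribution received:

    # Illustrative only: a rough sorting of a foreign financial contribution into
    # the FSR's value bands; names and return labels are assumptions of this sketch.
    DE_MINIMIS_EUR = 200_000      # per third country, over three consecutive years
    LIKELY_SAFE_EUR = 4_000_000   # per undertaking, over three consecutive years

    def fsr_band(total_contribution_eur: float) -> str:
        if total_contribution_eur <= DE_MINIMIS_EUR:
            return "not considered distortive (de minimis)"
        if total_contribution_eur <= LIKELY_SAFE_EUR:
            return "unlikely to cause distortions"
        # Above EUR 4 million the subsidy may be distortive, but only after the
        # Commission weighs the indicators listed above (amount, nature, purpose,
        # situation of the undertaking) and any positive effects.
        return "possibly distortive - indicator-based assessment required"

    print(fsr_band(150_000))     # not considered distortive (de minimis)
    print(fsr_band(2_500_000))   # unlikely to cause distortions
    print(fsr_band(10_000_000))  # possibly distortive - indicator-based assessment required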

The role of the European Commission

On its own initiative, the Commission may review a transaction or a public procurement procedure ex officio, on the grounds of information received from any source or of notifications of potentially subsidized M&A transactions or public procurement bids. If the Commission finds sufficient evidence of the existence of a distortive subsidy, it carries out a preliminary review.

When this procedure leads to enough evidence of the foreign distortive subsidy, the Commission initiates an in-depth investigation. When a foreign distortive subsidy is identified, the Commission can impose redressive measures or accept commitments.

The non-exhaustive list of redressive measures includes the reduction of capacity or market presence of the subsidized entity, the refraining from certain investments, and the repayment of the foreign subsidy.

The recipient of the subsidy may offer commitments and, for instance, pay back the subsidy. The Commission may accept commitments if it considers them to be full and effective remedies to the distortion.

A separate mechanism of market investigations allows the Commission to investigate a particular business sector, a type of economic activity, or a subsidy, if there is a reasonable suspicion of distortion. In its surveillance activities, the Commission can issue requests for information requiring entities or their associations to provide certain information, irrespective of whether they are subject to an investigation.

To block damaging activities, the Commission can impose interim measures. Additionally, it is authorized to impose fines on entities for breaching procedural requirements or for not providing information. The fines can reach 1% of the aggregate turnover, or 5% of the average daily aggregate turnover for each day of the violation, calculated on the previous year’s data. Fines can go up to 10% of the turnover when companies fail to notify a transaction or a subsidy granted during a public procurement procedure, implement a notified concentration before the end of the review period, or try to circumvent the notification requirements.
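
As a hedged numerical illustration only (the turnover figure is invented; the percentages are the caps described above), the ceilings would translate roughly as follows:

    # Invented turnover figure; percentages are the caps described above.
    aggregate_turnover_prev_year = 2_000_000_000          # EUR, hypothetical group
    avg_daily_turnover = aggregate_turnover_prev_year / 365

    procedural_fine_cap = 0.01 * aggregate_turnover_prev_year      # up to 1%
    periodic_penalty_per_day_cap = 0.05 * avg_daily_turnover       # up to 5% per day
    substantive_fine_cap = 0.10 * aggregate_turnover_prev_year     # up to 10%

    print(f"{procedural_fine_cap:,.0f}")            # 20,000,000
    print(f"{periodic_penalty_per_day_cap:,.0f}")   # 273,973 per day of violation
    print(f"{substantive_fine_cap:,.0f}")           # 200,000,000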

Conclusion

This measure constitutes a fundamental change in the EU approach to competition in the internal market. It will be important to see how much the Commission is going to use this new instrument, and how it is going to assess market distortions on a case-by-case basis, as there is probably going to be a delicate equilibrium with trade legislation and possible countervailing measures.

It is important for companies that operate in the EU and have received these kinds of financial contributions from non-EU countries to prepare quickly for the application of this new Regulation. Groups that may fall within this situation might want to reform their internal processes to collect information, understand the reporting requirements, and prepare justifications or notifications to the EU.

New EU rules for a Common Charger for Electronic Devices

By Olia Kanevskaia

On November 23, 2022, the European Union (“EU”) adopted the Directive 2022/2380 that mandates a common, EU-wide charger for electronic equipment (“the Common Charger Directive”).[1] The Directive prescribes the USB Type-C port as a mandatory standard for wired charging for a range of devices.

The new law amends the Radio Equipment Directive, which established a framework for the placing of radio and telecommunications equipment on the EU markets.[2] The legislation was passed after the Council’s and the European Parliament’s approval of the European Commission’s proposal, introduced in September 2021, and has been in force since December 2022. The EU Member States are required to transpose the Common Charger Directive into their national laws by December 28, 2023.

Objectives of the Directive

The new Directive mainly pursues two objectives: 1) the economic objective of the EU internal market and 2) the EU environmental objectives of reducing CO2 emissions and electronic waste. The economic objective prevails, since the legal basis for the Common Charger Directive is harmonization for the purpose of the proper functioning of the EU internal market.[3]

The new Directive harmonizes the EU-wide communication protocols and interfaces for wired chargers used, among others, in mobile phones, keyboards and laptops. As follows from the recitals, fragmentation of the EU market for radio equipment due to different national regulations and practices risks affecting cross-border trade and jeopardizes the functioning of the EU internal market.[4]

Furthermore, the Common Charger Directive aims to reduce electronic waste and greenhouse gas emissions that are the result of production and disposal of different electronic chargers. The Directive thus fits among the recent EU legislative initiatives aiming to boost circular economy.[5]

While the Directive does not explicitly list “consumer protection” as one of its objectives, it makes frequent references to consumer benefits and convenience from a common charger.

Wired charging standards

Electronic devices can be charged through cables or wires that are plugged into the device on one side and into the power outlet on the other. Connectors for wired chargers are not harmonized across different categories of radio and telecommunications equipment: while most devices support USB Type-C connectors, iPhones famously run on Apple’s proprietary Lightning cable. The type of connector – in other words, a standard for wired charging – is typically determined by the market, rather than by law.

The USB standards are developed by the USB Implementers Forum – a global non-profit organization dedicated to making, testing and promoting USB technologies. The USB Type-C standard is also endorsed as an international standard by the International Electrotechnical Commission (“IEC”) and transposed into a European standard by the European Committee for Electrotechnical Standardization (CENELEC).[6] The USB Type-C standard is widely used for different types of devices; yet this standard, and its international and European implementations, in principle remains voluntary.[7]

The EU has been restating the importance of compatibility between wired chargers for quite a while, but until recently it mainly relied on the industry to agree on common rules. In 2009, fourteen major phone manufacturers, including Samsung, LG and Apple, signed a voluntary commitment to develop a common charging solution in the form of a Memorandum of Understanding (“MoU”).[8] While many devices have since adopted Micro USB or, later, the USB Type-C standard as a wired charger connector, the MoU still allowed for the existence of proprietary charging interfaces like Apple’s Lightning. Attempts in European standardization committees to agree on a common connector seemed to have reached an impasse, and the voluntary approach resulted in much frustration for European legislators and consumers alike.

Upon the expiration of the MoU in 2014, the European Commission launched two impact assessment studies on the potential for implementing a common solution for wired chargers, followed by a resolution on a common charger for mobile radio equipment adopted by the European Parliament in 2020.[9] This eventually led to the Commission’s proposal to amend the Radio Equipment Directive and to mandate the USB Type-C standard as an EU-wide standard for electronic devices. Similar requirements may be adopted for wireless chargers in the near future.[10]

Key requirements of the new Directive

Article 3 of the Directive mandates a USB Type-C charger for a list of electronic equipment, including mobile phones, tablets, headsets, keyboards, e-readers and laptops.[11] This means that the devices must already be manufactured with a USB Type-C connector to be legally marketed in the EU. The European Commission reserves the right to amend the list of equipment that has to comply with the USB Type-C charger requirement in the light of scientific and technological progress or market developments. The listed equipment should comply with the mandated wired charging requirement by December 28, 2024; for laptops, this deadline is April 28, 2026.

The Commission may further adopt rules for charging interfaces and communications protocols for equipment that can be charged by means other than wired charging. This includes requesting the European standardization organizations to develop harmonized standards for charging interfaces and communications protocols for such equipment. Harmonized standards are voluntary, but compliance with them grants a presumption of conformity with European legislation.

When adopting or amending the rules for equipment charged by either wired or other means of charging, the European Commission should take into account the market acceptance of technologies under consideration, consumer convenience, and the reduction of environmental waste and market fragmentation. According to Article 3 (4) of the Directive, these objectives are presumed to be met by technical specifications that are based on relevant available international or European standards. The Directive, however, does not explain what it means by “being based on” and “relevant” or “available” standards. If such standards do not exist, or if the Commission determines that they do not meet the required objectives in an optimal manner, the Commission may develop its own technical specifications: this is in line with the Commission’s power to develop “common specifications” under the new legislation that heavily relies on harmonized standards.[12]

Furthermore, consumers should also be able to purchase electronic equipment without any charging device,[13] provided that the economic operators clearly indicate on a label whether or not the charger is included.[14] The Commission will monitor the extent to which this “unbundling” of charging devices from the radio equipment needs to be made mandatory.[15]

Outlook

The new Directive was met with enthusiasm by consumers, who will not need to purchase a new charger every time they buy a new electronic device. This will also reduce switching costs and prevent consumer lock-in in particular technologies or equipment. The disposal of wired chargers is also likely to be reduced, contributing to the EU’s environmental goals.

In turn, the requirement of a mandated standard for wired chargers does not sit well with some equipment manufacturers. For Apple, the new law means re-designing its products to comply with the EU legal requirements. Furthermore, many companies oppose the approach of standards and technologies being mandated “top down”, since technology selection typically occurs through the industry rather than the legislature. The danger is that, while pursuing the objective of greater interoperability, the EU will use this Directive as a precedent to intervene in market processes and, by this means, stifle innovation and technological advancement.


[1] Directive (EU) 2022/2380 of the European Parliament and of the Council of 23 November 2022 amending Directive 2014/53/EU on the harmonisation of the laws of the Member States relating to the making available on the market of radio equipment, OJ L 315

[2] Directive 2014/53/EU of the European Parliament and of the Council of 16 April 2014 on the harmonization of the laws of the member States relating to the making available on the market of radio equipment and repealing Directive 1999/5/EC, OJ L 153

[3] Article 114 TFEU

[4] Recitals 7 and 8 Common Charger Directive

[5] Recital 3 Common Charger Directive

[6] European Standard EN IEC 62680-1-3:2021 ‘Universal serial bus interfaces for data and power – Part 1-3: Common components – USB Type-C® Cable and Connector Specification’

[7] Case C-613/14, James Elliott Construction Ltd v. Irish Asphalt Ltd [2016] para 53

[8] MoU regarding Harmonisation of a Charging Capability for Mobile Phones (June 5th, 2009)

[9] European Parliament resolution of 30 January 2020 on a common charger for mobile radio equipment (2019/2983(RSP)) OJ C 331

[10] Recital 13 Common Charger Directive

[11] Article 3(4) and Annex Ia Part I Common Charger Directive

[12] See, for example, Article 41 of the Commission Proposal of 21 April 2021 for a Regulation of the European Parliament and of the Council laying down harmonized rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts, COM(2021) 206 final

[13] Article 3a Common Charger Directive

[14] Articles 10(8) 12(4) and 13(4) Common Charger Directive

[15] Article 47 Common Charger Directive

A Legal-Technical Basis for a Computational Transatlantic Trade and Investment Partnership (TTIP) Agreement

By Craig Atkinson

With the emergence of new modes of governance, this article[1] specifies a legal-technical basis – background, analytical structure, sources, methods, and research questions – to advance the notion of a ‘computable’ transatlantic trade agreement.

Background

Negotiations for a Transatlantic Trade and Investment Partnership (TTIP)[2] agreement between the European Union (EU) and the United States (US) began in 2013 and ended without conclusion in 2016. By April 2019, the EU had rendered its negotiating directives “obsolete and no longer relevant.”[3] While no agreement was finalized, terms under the TTIP ‘version 1.0’ were expected to add €120 billion to the output of the EU, €90 billion to the US economy, and €100 billion to the world economy.[4] Now, the stakes associated with EU-US cooperation are even higher: cross-border data flows[5] have become a greater driver / enabler of international commercial activity, the digitalization[6] of trade has accelerated, and the global ‘digital economy’ continues to expand.[7]

Re-connecting for ‘Digital Cooperation’: The Trade and Technology Council (TTC)

To re-engage and coordinate responses, the EU-US Trade and Technology Council (TTC)[8] was established in 2021 and seeks to enhance bilateral relations by, inter alia, mitigating technical barriers between the jurisdiction(s),[9] strengthening transatlantic supply chains, fostering cooperation on certain data issues,[10] setting standards, promoting digital tools for small business inclusion, and mutually reforming the rules-based multilateral trading system. With limited progress at the World Trade Organization (WTO), negotiations in other fora have achieved some success in devising wholly new frameworks, dedicated chapters in trade agreements, and specific provisions to bridge ‘analog-to-digital’ gaps.[11]

Yet, in identifying and attempting to reconcile policy differences via a thematic Working Group (WG) model,[12] TTC statements to “update the rules for the 21st century economy”[13] are not binding commitments. In lieu of a formal, comprehensive, and modern EU-US trade agreement,[14] maintenance of the status quo is both a risk and an opportunity cost.[15]

Enter: Applied Computational Law

Concurrently, applications of Computational Law (CompLaw)[16] are emerging that allow for the expression and online publication of digital versions of rules[17] as algorithms[18] to improve accessibility[19] for humans and support operationalization[20] by machines. Computational Law is that branch of legal informatics concerned with “the mechanization of legal analysis” and “the codification of regulations in precise, computable form.”[21] The field is loosely defined by, often interrelated, modelling techniques and associated sub-branches, including ‘Big Data Law’[22] analytics and ‘Algorithmic Law’[23] efforts to express the logic of rules as computable proxies.[24] With the potential to assist human decision-making[25] (e.g., through legal expert systems)[26] and process automation (e.g., via compliance automation systems), Computational Law may also address private rights and obligations: computable contracts,[27] financial rules, and ‘business rules’ (e.g., inventory, pricing, etc.).
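
To illustrate what the ‘codification of regulations in precise, computable form’ can look like in a trade setting, the sketch below encodes a single, entirely invented preferential-tariff eligibility rule as a computable proxy published alongside its natural-language text; the rule, its 55% threshold, and the function names are hypothetical and do not reproduce any actual TTIP, EU, US, or WTO provision.

    # Hypothetical "rule as code": a computable proxy for an invented
    # preferential-tariff eligibility rule, published alongside (not replacing)
    # its natural-language text. Nothing here reproduces an actual legal text.
    from dataclasses import dataclass

    RULE_TEXT = ("A good qualifies for the preferential rate if it is consigned "
                 "from a partner territory and at least 55% of its value "
                 "originates there.")

    @dataclass
    class Consignment:
        partner_territory: bool         # consigned from a partner territory?
        originating_value_share: float  # share of value originating there, 0.0-1.0

    def preferential_rate_applies(c: Consignment, threshold: float = 0.55) -> bool:
        """Machine-executable counterpart of RULE_TEXT: the human-readable rule
        and this proxy would be published and versioned together."""
        return c.partner_territory and c.originating_value_share >= threshold

    print(preferential_rate_applies(Consignment(True, 0.62)))   # True
    print(preferential_rate_applies(Consignment(True, 0.40)))   # False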

Analytical Structure and Sources[28]

As instruments begin to refer to governance[29] for, of, and by information and communications technology (ICT),[30] public and private branches of law can be used to construct a five-point legal-technical basis for a TTIP ‘version 2.0’[31] with computational rules (and data sources) in parallel to its natural language, other texts, and associated systems:[32]

  • First, by providing a ‘chapeau’ of concepts and methods, it is possible to describe the nature of EU-US relations in the age of Computational Law and the Internet.
  • Second, the identification of sources of public international law[33] – the WTO agreements, ongoing negotiations, plurilateral Joint Initiative (JI)[34] on E-commerce proposals, and legal instruments of the World Customs Organization (WCO) – assists in portraying the ‘multilateral interface’ for digital trade.[35]
  • Third, to complement the scope of the TTC, it is necessary to compare existing and envisaged sources of EU and US trade, business, technology, and privacy law. This includes the many EU ‘digital policy’ initiatives.[36]
  • Fourth, as discoverable in whole or in part in international agreements, legislation, regulations, and private contracts, it is essential to frame the institutional sources of ‘transnational commercial law’:[37] the principles, conventions, and model laws of the United Nations Commission on International Trade Law (UNCITRAL) and the International Institute for the Unification of Private Law (UNIDROIT). Relevant instruments of the Hague Conference on Private International Law (HCCH) and the International Chamber of Commerce (ICC) must also be considered.
  • Fifth, because ‘de facto’ and ‘de jure’ standards[38] facilitate the development of digital infrastructure, their recognition and classification present technical means to ‘seize the CompLaw opportunity’ for transatlantic trade.

Methods and Research Questions

Drawing from regime theory,[39] accounting for Commercial Law Intersections (CLIs),[40] and recognizing interplay with ‘constitutional’[41] and administrative law, the analytical structure may be employed to answer two questions:

  1. Which sources contain rules that may be appropriate[42] for algorithmic representation?
  2. How do these and other sources inform the legal environment for transatlantic digital trade?

Ultimately, by taking a comparative ‘Law + Technology’[43] approach to involve different legal subjects[44] and branches, it is feasible to hypothesize the composability[45] of hard and soft-law[46] to realize commercial activity under a ‘born digital’ transatlantic trade agreement.[47] Building on works in other jurisdictional contexts – transpacific[48] and pan-Africa[49] – outputs of the specified analytical structure are set to contribute to the advancement of legal informatics at the nexus of EU-US trade and technology policy regimes.


[1] Based on the introduction to the forthcoming TTLF Working Paper, A Transatlantic Trade and Investment Partnership ‘version 2.0’? International Commercial Rules in the Age of Computational Law.

[2] See EU negotiating texts in TTIP, European Commission, https://policy.trade.ec.europa.eu/eu-trade-relationships-country-and-region/countries-and-regions/united-states/eu-negotiating-texts-ttip_en.

[3] To pursue more limited and specific tariff negotiations on industrial goods, see Council Decision 6052/19, Authorising the opening of negotiations with the United States of America for an agreement on the elimination of tariffs for industrial goods, 2019.

[4] Gross Domestic Product (GDP). These and other estimates are subject to conjecture. See Werner Raza et al., ASSESS_TTIP: Assessing the Claimed Benefits of the Transatlantic Trade and Investment Partnership 1-5 (Österreichische Forschungsstiftung für Internationale Entwicklung – ÖFSE Oct. 2014).

[5] See Mira Burri, Data Flows versus Data Protection: Mapping Existing Reconciliation Models in Global Trade Law, in Law and Economics of Regulation 129 (Klaus Mathis & Avishalom Tor eds., Springer International Publishing 2021). See also OECD, Cross-Border Data Flows: Taking Stock of Key Policies and Initiatives (Dec. 2022). See further Javier López González et al., A Preliminary Mapping of Data Localisation Measures, OECD Trade Policy Papers (OECD Publishing 2022).

[6] The phase of ‘digital transformation’ that refers to process improvement(s). See Peter C. Verhoef et al., Digital Transformation: A Multidisciplinary Reflection and Research Agenda, 122 Journal of Business Research 889 (Jan. 2021).

[7] Amid expansion, EU-US digital trade flows are the “world’s most extensive”, yet differing policy stances (e.g., on data protection) caused the TTIP ‘version 1.0’ negotiations to fail. See Emily Jones et al., The UK and Digital Trade: Which Way Forward? (Oxford University Blavatnik School of Government Feb. 2021).

[8] See EU-US Trade and Technology Council Inaugural Joint Statement, European Commission (Sept. 29, 2021), https://ec.europa.eu/commission/presscorner/detail/en/statement_21_4951.

[9] Considering the potential for barriers within and across the supranational EU; the national and sub-national systems of EU Member States; and the US federal / ‘state’ system.

[10] In the 1980s, the US was the first jurisdiction to ‘govern’ data flows. See Susan A. Aaronson, The Digital Trade Imbalance and Its Implications for Internet Governance, Global Commission on Internet Governance (Feb. 2016). More recently, the EU and US have included varying language on data governance issues in bilateral / regional trade agreements, see Mira Burri, Digital Trade: In Search of Appropriate Regulation, in Justice, trade, security, and individual freedoms in the digital society 213 (Fernando Esteban de la Rosa et al. eds., Thomson Reuters Sep. 2021). See also Neha Mishra, Building Bridges: International Trade Law, Internet Governance, and the Regulation of Data Flows, 52 Vand. J. Transnat’l L. 463 (2019).

[11] See Mira Burri & Thomas Cottier, Introduction: Digital technologies and international trade regulation, in Trade Governance in the Digital Age: World Trade Forum 1–14 (2012).

[12] As a theme, transatlantic transfers of personal data fall outside of the scope of the TTC and have been negotiated separately under the ‘EU-US Data Privacy Framework’ (DPF). See Opinion 5/2023 on the European Commission Draft Implementing Decision on the Adequate Protection of Personal Data under the EU-US Data Privacy Framework (European Data Protection Board Feb. 2023). See also Hendrik Mildebrath, Reaching the EU-US Data Privacy Framework: First Reactions to Executive Order 14086, No. PE 739.261 (European Parliamentary Research Service – EPRS Dec. 2022).

[13] See U.S.-EU Establish Common Principles to Update the Rules for the 21st Century Economy at Inaugural Trade and Technology Council Meeting, The White House (Sept. 29, 2021), https://www.whitehouse.gov/briefing-room/statements-releases/2021/09/29/fact-sheet-u-s-eu-establish-common-principles-to-update-the-rules-for-the-21st-century-economy-at-inaugural-trade-and-technology-council-meeting.

[14] The scope of ‘modern’ trade agreements has expanded to cover new rules and their harmonization (e.g., data, intellectual property, health and safety, etc.). See Dani Rodrik, What Do Trade Agreements Really Do?, 32 Journal of Economic Perspectives 73 (Jan. 2018).

[15] On the perils of several meanings of fragmentation (e.g., legal/regulatory and technical), see Simon J. Evenett & Johannes Fritz, Emergent Digital Fragmentation: The Perils of Unilateralism – A Joint Report of the Digital Policy Alert and Global Trade Alert (CEPR Press 2022). See also ICC 2023 Trade Report: A Fragmenting World (ICC Apr. 2023). See further Panthea Pourmalek et al., As Digital Trade Expands, Data Governance Fragments, Centre for International Governance Innovation – CIGI (Feb. 9, 2023), https://www.cigionline.org/articles/as-digital-trade-expands-data-governance-fragments. In the context of supply chains, see Rebecca Harding, “Fragmentation”, Trade, and Supply Chain Resilience, Rebeccanomics (Apr. 17, 2023), https://rebeccanomics.com/rebeccas-blog/f/%E2%80%9Cfragmentation%E2%80%9D-trade-and-supply-chain-resilience.

[16] As first described in 2005 by Stanford University’s Nathaniel Love and Michael Genesereth in their seminal conference paper, see Nathaniel Love & Michael Genesereth, Computational Law, Proceedings of the 10th international conference on Artificial intelligence and law – ICAIL ‘05 205 (ACM Press 2005).

[17] See Ronald G. Ross, Rules: Shaping Behavior and Knowledge (Business Rule Solutions, LLC 1st ed. Jan. 2023).

[18] See Robert Kowalski, Algorithm = Logic + Control, 22 Communications of the ACM 424 (July 1979). See further Joseph Potvin, Data With Direction: Design Research Leading to a System Specification For ‘An Internet of Rules’ (Université du Québec en Outaouais 2023). In this form, ‘Rules as Data’ supplement normative expressions in natural languages and, while possibly ‘de jure’, are not to be considered as ‘law’ per se.

[19] Accessibility implies both access and capability (e.g., to understand and/or utilize data/information).

[20] The meanings of operationalization and application vary by discipline (e.g., law, computer science, etc.). See Meng Weng Wong, Rules as Code – Seven Levels of Digitisation (Singapore Management University Yong Pung How School of Law Apr. 2020).

[21] See Michael Genesereth, Computational Law: The Cop in the Backseat, CodeX — The Stan. Ctr. for Legal Informatics (2015), http://logic.stanford.edu/publications/genesereth/complaw.pdf. See also Michael Genesereth, What is Computational Law? CodeX — The Stan. Ctr. for Legal Informatics (Mar. 10, 2021), https://law.stanford.edu/2021/03/10/what-is-computational-law.

[22] Concerned with, “data-driven approaches to legal analysis… legal scholarship that leverages big data analytics—specifically, advances in statistical artificial intelligence, including machine learning, natural language processing, and deep learning—to identify patterns in legal information, to draw conclusions, to make policy recommendations, and to predict legal outcomes.” See Roland Vogl, Introduction to the Research Handbook on Big Data Law, in Research Handbook on Big Data Law 1–8 (Edward Elgar Publishing 2021).

[23] These approaches involve “transforming legislation and other legal sources into algorithms,” see Dag Wiese Schartum, From Algorithmic Law to Automation-Friendly Legislation, Computers & Law (Society for Computers and Law Aug. 2016), https://www.scl.org/articles/3716-from-algorithmic-law-to-automation-friendly-legislation. See also Megan Ma, Story of a Legal Codex(t) Writing Law in Code (École de Droit de Sciences Po 2021). This scholarship does not assume a ‘code’ or ‘programming language for the law’ based approach.

[24] Similarly bifurcated by Mireille Hildebrandt as ‘data-driven’ and ‘code-driven’. See Data-driven ‘law’, CoHuBiCoL, https://www.cohubicol.com/about/data-driven-law. See also Code-driven ‘law’, CoHuBiCoL, https://www.cohubicol.com/about/code-driven-law. Such categorizations are solely for the purposes of comparison and many approaches involve a ‘hybrid’ of techniques. See L. Thorne McCarty, A Language for Legal Discourse is All You Need, MIT Computational Law Report (2022), https://bit.ly/3ewZzh1. See also Bridging the Gap between Machine Learning and Logical Rules in Computational Legal Studies (Mar. 2022), https://youtu.be/rBPadM9tyNo. The use of the word ‘proxy’ is in place of any dominant way to describe the models, expressions, representations, etc. of natural language rules in computable form.

[25] Where possible (i.e., when not referring to a particular legal text or jurisdiction-specific jargon), this scholarship consciously avoids the term ‘automated decision-making’ (ADM) and considers that only humans can make informed ‘decisions’ and consent to action/inaction (i.e., subject to audit of any algorithm’s logic and control components).

[26] See, e.g., Richard E. Susskind, Expert Systems in Law: A Jurisprudential Approach to Artificial Intelligence and Legal Reasoning, 49 Mod. L. Rev. 168 (Mar. 1986).

[27] See Harry Surden, Computable Contracts, 46 U.C. Davis L. Rev. 72 (2012). See also Smart Legal Contracts: Computable Law in Theory and Practice (Jason Allen & Peter Hunn eds., Oxford University Press 1st ed. Apr. 2022).

[28] Sources of law are recognized by jurisdiction and under international law by the International Court of Justice (ICJ). See Statute of the International Court of Justice, art. 38, ¶ 1, concluded at San Francisco June 26, 1945, entered into force Oct. 24, 1945, T.S. 993. Although there is no consensus on the definition of a ‘rule’, it is generally understood that legal texts (e.g., treaties, legislation, regulations, case law, and contracts) are the source of norms, rules, and guidelines. See LegalRuleML Core Specification Version 1.0 (Organization for the Advancement of Structured Information Standards – OASIS Aug. 2021), https://docs.oasis-open.org/legalruleml/legalruleml-core-spec/v1.0/legalruleml-core-spec-v1.0.html.

[29] Broadly, governance refers to, “making decisions and exercising authority to guide the behaviour of individuals and organizations. Governance is commonly achieved by the creation and enforcement of explicit rules… less explicit social norms, guidelines, policies, or the creation of defined command structures.” See Agile Governance: Reimagining Policy-making in the Fourth Industrial Revolution 16 (World Economic Forum Jan. 2018).

[30] Here, for refers to status (e.g., legal recognition of electronic documents), of relates to limitation (e.g., data protection regulations), and by implies operationalization (e.g., via the systems of governments and/or private individuals/entities). See Governance Innovation: Redesigning Law and Architecture for Society 5.0, Ministry of Econ., Trade & Industry (METI), https://www.meti.go.jp/press/2020/07/20200713001/20200713001-2.pdf (Japan).

[31] The TTLF Working Paper also exists as a ‘living’ GitHub project. See TTIPv2, https://github.com/lexmerca/TTIPv2_ToC.

[32] This includes a variety of ‘systems’ used in trade and commerce. For Customs, the EU and the US are pursuing modernization through ‘single window’ systems. See Recommendation and Guidelines on Establishing a Single Window to Enhance the Efficient Exchange of Information between Trade and Government: Recommendation No. 33 (United Nations Centre for Trade Facilitation and Electronic Business 2005). In the EU, see Parliament and Council Regulation 2022/2399, Establishing the European Union Single Window Environment for Customs, 2022 O.J. (L 317), 1. In the US, the single window for trade is the ‘Automated Commercial Environment’, see ACE Portal Modernization, US Customs and Border Protection, https://www.cbp.gov/trade/automated/ace-portal-modernization.

[33] Typically concerned with, “the relations of states, and states and state-created international organizations, and increasingly states and individuals. The source of law here is mostly comprised of treaties and custom…” See Volume I: The Foundations of Transnational Law (Hofstra University School of Law 2012). See also Alan O. Sykes, The Inaugural Robert A. Kindler Professorship of Law Lecture: When is International Law Useful?, 45 N.Y.U. J. Int’l L. & Pol. (Mar. 2013), https://law.stanford.edu/publications/the-inaugural-robert-a-kindler-professorship-of-law-lecture-when-is-international-law-useful.

[34] Formerly known as the ‘Joint Statement Initiative’ (JSI) on E-commerce.

[35] Defined by the Organisation for Economic Co-operation and Development (OECD), the WTO, and the International Monetary Fund (IMF) as trade that is ‘digitally ordered’ and/or ‘digitally delivered’, where digitally ordered is, “the international sale or purchase of a good or service, conducted over computer networks by methods specifically designed for the purpose of receiving or placing orders” and digitally delivered reflects “international transactions that are delivered remotely in an electronic format, using computer networks specifically designed for the purpose.” See Handbook on Measuring Digital Trade (OECD-WTO-IMF 2020). Under the WTO system, see Robert Staiger, Does Digital Trade Change the Purpose of a Trade Agreement?, No. w29578 (National Bureau of Economic Research Dec. 2021).

[36] For example, the EU electronic IDentification, Authentication and trust Services (eIDAS) regulation, the Digital Markets Act (DMA), the Digital Services Act (DSA), the Data Governance Act (DGA), and the Data Act.

[37] Here, transnational commercial law is, “that set of rules, from whatever source, which governs international commercial transactions and is… derived from international instruments of various kinds, such as conventions and model laws, and from codification of international trade usage adopted by contract.” See Royston Miles Goode et al., Transnational Commercial Law: Text, Cases, and Materials (Oxford University Press 2015). In relation to ‘transnational data governance’ issues, see Douglas W. Arner et al., The Transnational Data Governance Problem, 37 Berkeley Tech. L.J. 623 (Berkeley Technology Law Journal 2022).

[38] Emmanuelle Ganne & Hannah Nguyen, Standards Toolkit for Cross-Border Paperless Trade: Accelerating Trade Digitalisation Through the Use of Standards (ICC & World Trade Org. 2022).

[39] See Stephen D. Krasner, Structural Causes and Regime Consequences: Regimes as Intervening Variables, 36 International Organization 185 (1982). See also Anu Bradford, Regime Theory, Max Planck Encyclopedia of Public International Law (Oxford University Press 2007). See further Jeswald W. Salacuse, Making transnational law work through regime-building, in Making Transnational Law Work in the Global Economy 406–430 (Pieter H. F. Bekker, Rudolf Dolzer, & Michael Waibel eds., 2010).

[40] Where business and commercial law have, “grown into a dense thicket of subject-specific branches that govern a broad range of transactions and corporate actions. When one of such dealings or activities falls concurrently within the purview of two or more of these commercial law branches… an overlap materializes… The unharmonious convergence of commercial law branches generates failures in coordination that both increase transaction costs and distort incentives for market participants.” See Giuliano G. Castellano & Andrea Tosato, Commercial Law Intersections, 72 Hastings L.J. (Apr. 2021), https://repository.uchastings.edu/hastings_law_journal/vol72/iss4/2. In advancing the conceptualization of CLIs, see further Douglas W. Arner et al., Financial Data Governance: The Datafication of Finance, the Rise of Open Banking and the End of the Data Centralization Paradigm, 117 University of Hong Kong Faculty of Law Research Paper (Feb. 2022).

[41] The EU has not formally ratified a ‘constitution’ and is ‘constituted’ by treaties and its ‘acquis communautaire’.

[42] The extent of ‘appropriateness’ can be analyzed through dimensions related to discretion, risk, and how ‘practicable’ a rule is. 

[43] The ‘Law + Technology’ approach builds on complexity science and other disciplines / frameworks (e.g., ‘Code / Data as Law’ and ‘Law as Code / Data’) to consider both the issues and positive contributions that technology can bring to society. See Thibault Schrepel, Law + Technology (v2.0), CodeX — The Stan. Ctr. for Legal Informatics Working Paper Series (Jan. 2023).

[44] See Laurence Diver, 3.4.2 Legal Subject, in Text-Driven Normativity (CoHuBiCoL Jul. 2021). In international law, ‘persons’ may be primary (e.g., states, international organizations) or secondary (e.g., corporations, individuals).

[45] The modular assembly of components within any functional system design.

[46] Respectively understood as ‘binding’ and ‘non-binding’ instruments, yet perspectives vary among scholars (e.g., on the nature of enforceability) and across disciplines. See Kenneth W. Abbott & Duncan Snidal, Hard and Soft Law in International Governance, 54 International Organization 421–456 (2000).

[47] Craig Atkinson, Africa’s Potential ‘Born Digital’ Trade Agreement, 1 International Trade Forum 28–29 (International Trade Centre 2019).

[48] Craig Atkinson & Nicolás Schubert, Augmenting MSME Participation in Trade with Policy Digitalisation Efforts: Chile’s Contribution to ‘An Internet of Rules,’ 13 Trade L. & Dev. 80 (2021).

[49] Craig Atkinson & Joseph Potvin, Implementing the African Continental Free Trade Area: A Simple, Scalable, and Fast Computational Approach for Algorithmic Governance, in Sustainable Development in Post-Pandemic Africa: Effective Strategies for Resource Mobilization (Routledge Oct. 2022).

The Digital Markets Act: EU’s Big Policy Promise for Big Tech

By Christine Carter

The EU Digital Markets Act (DMA) is the latest piece of the European Commission’s digital reform agenda to create a comprehensive and sophisticated regulatory regime for the Big Tech industry. The DMA was originally proposed in December 2020 and passed its final vote in the European Parliament on 5 July 2022 with an overwhelming majority of 588 votes in favor and 11 against. The act is expected to be formally adopted by the Council of the European Union and published in autumn 2022. Big tech companies that fall within the scope of the act will then have to notify the European Commission within a period of two months, starting from spring 2023, and comply with the DMA’s new regulatory obligations by early 2024. In doing so, the DMA contributes to Europe’s digital reform agenda of “making Europe fit for the Digital Age” and will apply alongside the EU Digital Services Act (DSA)[1].

Background

The DMA was introduced to deal with the rapid growth of the digital economy over the last decade, driven by the expansion of big tech companies. The digital economy is characterized in particular by the dominance of the so-called “GAFA” companies (Google, Amazon, Facebook and Apple). In response to their increased market share in the EU, a series of regulatory challenges and investigations has been litigated before both national and EU courts. One of the most seminal of these disputes arose from the European Commission’s investigation[2] into Google in what has come to be known as the Google Shopping case. Such proceedings have fuelled concerns that European courts and enforcers are slow to deal with competition issues arising in the digital market and lag behind the speed at which the digital economy is evolving.

Legislative Objective

Against this background, the European Commission introduced the DMA as a set of ex ante obligations that can respond quickly and meaningfully to the legal challenges raised by the Big Tech industry. The DMA takes an unprecedented step in shifting enforcement in the Big Tech industry from a largely self-regulated to a regulated model. Margrethe Vestager, Executive Vice-President of the European Commission, described the act as a “global movement” that will “inspire all over the planet”.[3] The DMA implements this policy agenda through a series of regulatory obligations for large tech corporations and provides the European Commission with a new set of enforcement powers to take action where those obligations are not met. This regulatory approach seeks to address the current shortcomings of EU competition law in regulating the digital market and to ensure a level playing field among large tech corporations.

Gatekeepers and Core Platforms Services

To fall within the act’s definition of a gatekeeper, a company must provide one or more of the core platform services defined in Article 2(2) of the DMA and meet the qualitative criteria listed in Article 3(1), each of which is presumed to be satisfied where the corresponding quantitative thresholds in Article 3(2) are met (a simplified sketch follows the list below):

  • A company must have a significant impact on the internal market: this is presumed where the company has an annual turnover in the Union of at least €7.5 billion or an average market capitalisation of at least €75 billion, and it provides the same core platform service in at least three Member States (Articles 3(1)(a) and 3(2)(a));
  • A company must operate one or more important gateways for business users to reach end users: this is presumed where the company has at least 45 million monthly active end users and at least 10,000 yearly active business users located or established in the EU in the last financial year (Articles 3(1)(b) and 3(2)(b));
  • A company must enjoy an entrenched and durable position: this is presumed where the user thresholds in the previous point were met in each of the last three financial years (Articles 3(1)(c) and 3(2)(c)).
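
The quantitative presumptions lend themselves to a simple rule check. The following is a minimal, non-authoritative Python sketch of that logic; the Undertaking data class, its field names and the presumed_gatekeeper function are assumptions made purely for illustration and are not drawn from the DMA or from any official designation tool.

```python
from dataclasses import dataclass

# Simplified thresholds taken from the Article 3(2) presumptions summarised above.
TURNOVER_EUR_BN = 7.5        # annual Union turnover, in EUR billion
MARKET_CAP_EUR_BN = 75.0     # average market capitalisation, in EUR billion
END_USERS = 45_000_000       # monthly active end users in the EU
BUSINESS_USERS = 10_000      # yearly active business users in the EU
MEMBER_STATES = 3            # same core platform service in at least three Member States
DURABLE_YEARS = 3            # user thresholds met in each of the last three financial years


@dataclass
class Undertaking:
    """Hypothetical record of the figures relevant to the presumptions."""
    annual_union_turnover_bn: float
    avg_market_cap_bn: float
    member_states_served: int
    monthly_end_users: int
    yearly_business_users: int
    years_meeting_user_thresholds: int


def presumed_gatekeeper(u: Undertaking) -> bool:
    """Rough check of whether the three Article 3(2) presumptions appear to be met."""
    significant_impact = (
        (u.annual_union_turnover_bn >= TURNOVER_EUR_BN
         or u.avg_market_cap_bn >= MARKET_CAP_EUR_BN)
        and u.member_states_served >= MEMBER_STATES
    )
    important_gateway = (
        u.monthly_end_users >= END_USERS
        and u.yearly_business_users >= BUSINESS_USERS
    )
    entrenched_durable = u.years_meeting_user_thresholds >= DURABLE_YEARS
    return significant_impact and important_gateway and entrenched_durable
```

Passing the presumption is not the end of the matter: as the next paragraph explains, companies below the thresholds may still be designated after a qualitative assessment, and designation always rests on the Commission’s decision rather than on any mechanical test.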

Companies that do not meet the quantitative thresholds may still be designated as gatekeepers on the basis of a qualitative assessment carried out by the European Commission following a market investigation (Article 3(8) in conjunction with Article 17). Companies that fall within the definition of gatekeepers must comply with the obligations laid out in Articles 5 to 7 and 14 of the DMA. These obligations roughly cover the following themes: data protection, device neutrality, transparency in online advertising, ranking neutrality, neutrality towards intermediaries and distributors, and enforcement-related duties. They are divided into two levels of severity. The first are the ‘black list’ obligations, which apply directly to gatekeepers without further specification. The second are the ‘grey list’ obligations, which may be specified in further detail by the Commission following a dialogue with the gatekeeper.

Black List Obligations

These include:

  • prohibiting gatekeepers from processing personal data for advertising, and from combining or cross-using personal data or signing users in to other services in order to combine data, without the users’ consent (Article 5(2));
  • prohibiting gatekeepers from preventing business users from offering products or services through other channels (Article 5(3));
  • prohibiting anti-steering provisions (Articles 5(4)-(5));
  • prohibiting restrictions on business users raising issues with authorities (Article 5(6));
  • prohibiting gatekeepers from requiring users to use the gatekeeper’s identification service, web browser engine or payment service (Article 5(7));
  • prohibiting gatekeepers from bundling subscriptions or registrations (Article 5(8));
  • requiring gatekeepers to disclose pricing and remuneration information for advertising services to advertisers and publishers (Articles 5(9)-(10)).

Grey List Obligations

These include:

  • prohibiting gatekeepers from using non-public data of business users to compete with those business users (Article 6(2));
  • requiring gatekeepers to allow app uninstallation, changes to default settings with choice screens, and the installation of third-party apps and app stores on their operating systems (Articles 6(3)-(4));
  • prohibiting gatekeepers from ranking their own services and products more favourably than those of third parties (Article 6(5));
  • prohibiting restrictions on multi-homing (Article 6(6));
  • requiring interoperability of operating systems and virtual assistants (Article 6(7));
  • requiring the provision of access to performance-measuring tools (Article 6(8));
  • requiring data portability for end users and data access for business users (Articles 6(9)-(10));
  • requiring the sharing of search data (Article 6(11));
  • requiring fair access to app stores, search engines and social networking services (Article 6(12));
  • prohibiting conditions that make terminating the use of a core platform service disproportionately difficult for end users (Article 6(13));
  • requiring interoperability for messaging services (Article 7).
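
Purely as a reading aid, the two tiers can be captured as a small lookup table keyed by article number. In this hypothetical Python sketch the OBLIGATIONS mapping and the one-line summaries are this author’s own shorthand, not official text of the Regulation.

```python
from typing import Optional

# Hypothetical lookup table of DMA obligations by tier and article, paraphrased.
# The legally binding wording is that of Articles 5 to 7 of the Regulation itself.
OBLIGATIONS = {
    "black_list": {
        "5(2)": "no combining or cross-using of personal data without consent",
        "5(3)": "no restrictions on business users selling through other channels",
        "5(4)-(5)": "no anti-steering provisions",
        "5(6)": "no restrictions on raising issues with authorities",
        "5(7)": "no forced use of gatekeeper identification or payment services",
        "5(8)": "no bundling of subscriptions or registrations",
        "5(9)-(10)": "disclose advertising price and remuneration information",
    },
    "grey_list": {
        "6(2)": "no use of non-public business-user data to compete",
        "6(3)-(4)": "allow uninstallation, default changes, third-party apps and stores",
        "6(5)": "no self-preferencing in rankings",
        "6(6)": "no restrictions on multi-homing",
        "6(7)": "interoperability of operating systems and virtual assistants",
        "6(8)": "access to performance-measuring tools",
        "6(9)-(10)": "data portability and business-user data access",
        "6(11)": "search data sharing",
        "6(12)": "fair access to app stores, search engines and social networks",
        "6(13)": "no disproportionate barriers to termination",
        "7": "interoperability for messaging services",
    },
}


def tier_of(article: str) -> Optional[str]:
    """Return 'black_list' or 'grey_list' for a known article key, else None."""
    for tier, rules in OBLIGATIONS.items():
        if article in rules:
            return tier
    return None


print(tier_of("5(7)"))   # black_list
print(tier_of("6(11)"))  # grey_list
```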

Notification of Mergers

The DMA also imposes a duty on gatekeepers to inform the European Commission about planned mergers with other platform services or digital-sector entities under Article 14 of the act. This provision applies regardless of whether the transaction is notifiable under EU or national merger control rules and is designed to keep the European Commission informed about gatekeepers’ market position and aware of potential killer acquisitions that would create barriers to entry in the internal market. The DMA thereby allows the Commission to monitor such transactions preemptively even where the thresholds of the EU Merger Regulation are not met, and the information gathered may support referral requests under Article 22 of that Regulation. In so doing, the DMA considerably strengthens the EU’s merger control regime.

Enforcement

The DMA gives the European Commission several enforcement powers in the event of a gatekeeper’s non-compliance with the aforementioned obligations. Under Article 30 of the DMA, the Commission may impose a fine of up to 10% of the gatekeeper’s worldwide turnover in the previous financial year (or up to 1% for less serious infringements), rising to a maximum of 20% of worldwide turnover in the event of a repeated violation. The imposition of periodic penalty payments is also permitted under Article 31 of the DMA. In the event of systematic non-compliance, the Commission may also open a market investigation against the gatekeeper. In addition to these public sanctions, it is expected that individuals may bring private actions against gatekeepers before national courts, supported by Articles 39 and 42 of the DMA.
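
To make the sanction ceilings concrete, the short sketch below computes the maximum fine from a gatekeeper’s worldwide turnover in the previous financial year. The function name, parameters and the example turnover figure are illustrative assumptions; the DMA prescribes only the percentage ceilings, not any calculation tool, and actual fines are set by the Commission within those ceilings.

```python
def max_fine_eur(worldwide_turnover_eur: float, *,
                 repeated: bool = False, minor: bool = False) -> float:
    """Ceiling of an Article 30 DMA fine, in EUR (simplified illustration).

    Up to 1% of worldwide turnover for less serious infringements, up to 10%
    for breaches of the core obligations, and up to 20% for repeated breaches.
    """
    if minor:
        rate = 0.01
    elif repeated:
        rate = 0.20
    else:
        rate = 0.10
    return rate * worldwide_turnover_eur


# Hypothetical gatekeeper with EUR 100 billion worldwide turnover in the
# previous financial year:
print(max_fine_eur(100e9))                 # 10000000000.0  (EUR 10 billion ceiling)
print(max_fine_eur(100e9, repeated=True))  # 20000000000.0  (EUR 20 billion ceiling)
```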

Taxonomy within the EU Legal Order

Recital 10 of the DMA explains that the act is without prejudice to Articles 101 and 102 TFEU and will therefore operate in parallel to the existing body of EU competition law and the relevant EU merger control rules. The relationship between the DMA and national Member State laws depends on whether the national measure is regulatory or competition law in nature. Where the national measure is regulatory, the DMA supersedes it pursuant to Article 1(5), which precludes Member States from imposing further obligations on gatekeepers for the purpose of ensuring contestable and fair markets. Where the national measure is competition law, the DMA does not supersede it: Article 1(6) preserves the application of national competition rules, including rules that apply to undertakings other than gatekeepers or that impose obligations on gatekeepers on grounds other than those pursued by the DMA.

Conclusion

The DMA certainly brings a lot of legislative promise to the table and has been met with a great deal of hope in the legal world. The act itself is detailed and precise and leaves little room for ambiguity. It also creates a hierarchy between the black list and grey list obligations, enabling the European Commission to handle the former strictly while approaching the latter in a manner more conducive to collaboration and cooperation with gatekeepers. From this perspective, the DMA represents a fair compromise between the need to protect digital rights and competition interests in a technological world, the need to recognize the political, social and economic reality in which those rights operate, and the ability to strike economic and political compromise where necessary to reach consensus. It does so in an efficient and innovative manner, which will hopefully provide certainty and clarity on the regulatory status of many Big Tech companies in Europe and avoid protracted proceedings before national and supranational courts and tribunals in cases of dispute. What remains to be seen, however, is how effective the act will be with respect to emerging gatekeepers and companies that do not meet the Article 3 requirements. This will depend on the intensity of the European Commission’s reviews and market investigations, and it will be interesting to see how the EU exercises its regulatory mandate to bring further companies within the scope of the DMA.


[1] European Commission, Shaping Europe’s Digital Future (2022).

[2] European Commission, Press Release, Antitrust: Commission Probes Allegations of Antitrust Violations by Google (Nov. 30, 2010).

[3] European Parliament, Press Release, Deal on Digital Markets Act: EU Rules to Ensure Fair Competition and More Choice for Users (Mar. 24, 2022).